Inequalities for Shannon Entropy and Kolmogorov Complexity

Authors

  • Daniel Hammer
  • Andrei E. Romashchenko
  • Alexander Shen
  • Nikolai K. Vereshchagin
Abstract

It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662–664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov complexity are also valid for Shannon entropy and vice versa; (2) all linear inequalities that are valid for Shannon entropy are valid for ranks of finite subsets of linear spaces; (3) the opposite statement is not true: Ingleton's inequality (1971, "Combinatorial Mathematics and Its Applications," pp. 149–167, Academic Press, San Diego) is valid for ranks but not for Shannon entropy; (4) for some special cases all three classes of inequalities coincide and have a simple description. We present an inequality for Kolmogorov complexity that implies Ingleton's inequality for ranks; another application of this inequality is a new simple proof of one of Gács and Körner's results on common information (1973, Problems Control Inform. Theory 2, 149–162). © 2000 Academic Press. doi:10.1006/jcss.1999.1677, available online at http://www.idealibrary.com.
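For orientation, here is a standard example of the kind of linear inequalities discussed (the notation is generic and not quoted from the paper): the basic inequality for Shannon entropy, its Kolmogorov-complexity counterpart, which holds only up to a logarithmic error term, and Ingleton's inequality for ranks of subspaces, which fails for entropy.

  \[ H(X) + H(Y) \;\ge\; H(X,Y) \qquad \text{(basic inequality; equivalently } I(X;Y) \ge 0\text{)} \]
  \[ C(x) + C(y) \;\ge\; C(x,y) - O(\log n) \qquad \text{(for strings } x, y \text{ of length at most } n\text{)} \]
  \[ \mathrm{rk}(A{+}B) + \mathrm{rk}(A{+}C) + \mathrm{rk}(A{+}D) + \mathrm{rk}(B{+}C) + \mathrm{rk}(B{+}D) \;\ge\; \mathrm{rk}(A) + \mathrm{rk}(B) + \mathrm{rk}(C{+}D) + \mathrm{rk}(A{+}B{+}C) + \mathrm{rk}(A{+}B{+}D) \]

The last line is Ingleton's inequality for subspaces A, B, C, D of a linear space, with + denoting the sum of subspaces.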


Related articles

Inequalities for Shannon entropies and Kolmogorov complexities

The paper investigates connections between linear inequalities that are valid for Shannon entropies and for Kolmogorov complexities.


Facets of Entropy

Constraints on the entropy function are of fundamental importance in information theory. For a long time, the polymatroidal axioms, or equivalently the nonnegativity of the Shannon information measures, were the only known constraints. Inequalities that are implied by the nonnegativity of the Shannon information measures are categorically referred to as Shannon-type inequalities. If the number of ran...
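For reference, the polymatroidal axioms mentioned here can be written as follows for the entropy function H on subsets A, B of a set of random variables (a standard formulation, not quoted from this abstract):

  \[ H(\varnothing) = 0, \qquad H(A) \le H(B) \ \text{for } A \subseteq B, \qquad H(A) + H(B) \ge H(A \cup B) + H(A \cap B). \]

Equivalently, all Shannon information measures are nonnegative: \( H(X \mid Y) \ge 0 \) and \( I(X;Y \mid Z) \ge 0 \).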


Recent Progresses in Characterising Information Inequalities

In this paper, we present a review of some of the recent progress made in characterising and understanding information inequalities, which are the fundamental physical laws in communications and compression. We will begin with the introduction of a geometric framework for information inequalities, followed by the first non-Shannon inequality proved by Zhang et al. in 1998 [1]. The discovery...
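The non-Shannon-type inequality of Zhang and Yeung referred to above is commonly stated as follows (quoted from the literature, not from this snippet), for arbitrary jointly distributed random variables A, B, C, D:

  \[ 2\,I(C;D) \;\le\; I(A;B) + I(A;C,D) + 3\,I(C;D \mid A) + I(C;D \mid B). \]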


Upcrossing Inequalities for Stationary Sequences and Applications to Entropy and Complexity

An empirical statistic for a class \( \mathcal{C} \) of stationary processes is a function \( g \) which assigns to each process \( (X_n) \in \mathcal{C} \) with distribution \( P \) and to each sample \( X_1, \dots, X_n \) of the process a real number \( g_P(X_1, \dots, X_n) \). We describe a condition on \( g \) which implies that the sequence \( (g_P(X_1, \dots, X_n))_{n=1}^{\infty} \) obeys a (universal) upcrossing inequality, that is, that the probability that this sequence fl...
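A typical example of such an empirical statistic (an illustration chosen here, not necessarily the one treated in that paper) is the normalized log-probability, whose almost-sure limit for ergodic processes is the entropy rate by the Shannon–McMillan–Breiman theorem:

  \[ g_P(X_1, \dots, X_n) \;=\; -\tfrac{1}{n} \log P(X_1, \dots, X_n) \;\xrightarrow[n \to \infty]{\text{a.s.}}\; h(P). \]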


Shannon Entropy vs. Kolmogorov Complexity

Most assertions involving Shannon entropy have their Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon's theory is also valid in Kolmogorov's theory, and vice versa. In this paper we prove that this is no longer true for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon e...
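Schematically, the theorem mentioned here says that the two families of linear inequalities coincide, with logarithmic precision on the Kolmogorov side (a paraphrase, not the paper's exact statement): for real coefficients \( \lambda_I \) indexed by nonempty sets \( I \subseteq \{1, \dots, m\} \),

  \[ \sum_I \lambda_I\, H(\xi_I) \;\ge\; 0 \ \text{ for all random variables } \xi_1, \dots, \xi_m \quad\Longleftrightarrow\quad \sum_I \lambda_I\, C(x_I) \;\ge\; -O(\log N) \ \text{ for all strings } x_1, \dots, x_m \text{ of total length } N, \]

where \( \xi_I \) and \( x_I \) denote the tuple of variables (respectively strings) with indices in \( I \).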




Journal:
  • J. Comput. Syst. Sci.

Volume 60, Issue 

Pages  -

Publication year: 2000